Consistency of the kernel density estimator - a survey
Abstract
Various consistency proofs for the kernel density estimator have been developed over the last few decades. Important milestones are pointwise consistency and almost sure uniform convergence with a fixed bandwidth on the one hand, and rates of convergence with a fixed or even a variable bandwidth on the other. While considering global properties of the empirical distribution function is sufficient for strong consistency, proofs of exact convergence rates use deeper information about the underlying empirical processes. A unifying feature, however, is that both earlier and more recent proofs rely on bounds on the probability that a sum of random variables deviates from its mean.
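For concreteness, the estimator under discussion is the standard fixed-bandwidth kernel density estimator f_n(x) = (1/(n h_n)) Σ_i K((x − X_i)/h_n). The following minimal Python sketch implements this form, assuming a Gaussian kernel and an arbitrary illustrative bandwidth; neither choice is prescribed by the survey.

import numpy as np

def kde(x, sample, h):
    """Fixed-bandwidth kernel density estimate at the query points x.

    Computes (1/(n*h)) * sum_i K((x - X_i)/h) with a Gaussian kernel K.
    The kernel and the bandwidth h are illustrative choices only.
    """
    x = np.atleast_1d(x)[:, None]                      # query points, shape (m, 1)
    u = (x - np.asarray(sample)[None, :]) / h          # scaled differences, shape (m, n)
    k = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)     # Gaussian kernel values
    return k.mean(axis=1) / h                          # average kernel mass per query point

rng = np.random.default_rng(0)
sample = rng.normal(size=500)
print(kde([0.0, 1.0], sample, h=0.3))                  # estimates near 0 and 1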
Similar articles
Asymptotic Behaviors of Nearest Neighbor Kernel Density Estimator in Left-truncated Data
Kernel density estimators are the basic tools for density estimation in non-parametric statistics. The k-nearest neighbor kernel estimators represent a special form of kernel density estimator in which the bandwidth varies with the location of the sample points. In this paper, we initially introduce the k-nearest neighbor kernel density estimator in the random left-truncatio...
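The standard complete-data version of this variable-bandwidth idea replaces the fixed h by R_k(x), the distance from x to its k-th nearest sample point. The sketch below shows only this uncensored baseline; the left-truncation adjustment studied in the paper is not reproduced here, and the kernel and the value of k are illustrative choices.

import numpy as np

def knn_kde(x, sample, k):
    """k-nearest-neighbor kernel density estimate (complete-data baseline).

    The bandwidth at x is R_k(x), the distance from x to its k-th nearest
    sample point, so smoothing adapts to the local density of the data.
    Uses a Gaussian kernel; no truncation adjustment is applied.
    """
    x = np.atleast_1d(x)[:, None]
    d = np.abs(x - np.asarray(sample)[None, :])        # |x - X_i| for every pair
    r_k = np.sort(d, axis=1)[:, k - 1:k]               # k-th nearest distance per query point
    kern = np.exp(-0.5 * (d / r_k)**2) / np.sqrt(2.0 * np.pi)
    return (kern / r_k).mean(axis=1)

rng = np.random.default_rng(1)
sample = rng.exponential(size=400)
print(knn_kde([0.5, 1.5], sample, k=30))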
Some Asymptotic Results of Kernel Density Estimator in Length-Biased Sampling
In this paper, we prove the strong uniform consistency and asymptotic normality of the kernel density estimator proposed by Jones [12] for length-biased data. The approach is based on the invariance principle for empirical processes proved by Horváth [10]. Simulations are carried out for different cases to demonstrate both consistency and asymptotic normality, and the method is illustrated by ...
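The estimator referred to above is usually stated in the following form: each observation Y_i from the length-biased sample is reweighted by 1/Y_i, and the result is renormalized by the harmonic-mean estimate of the mean μ. The sketch below assumes this commonly cited form together with an illustrative Gaussian kernel and bandwidth; it is not taken from the paper itself.

import numpy as np

def jones_kde(x, y, h):
    """Kernel density estimate for length-biased data, in the form usually
    attributed to Jones: (mu_hat / (n*h)) * sum_i K((x - Y_i)/h) / Y_i,
    where mu_hat is the harmonic-mean estimate of mu. Sketch only; the
    kernel and bandwidth are illustrative choices."""
    x = np.atleast_1d(x)[:, None]
    y = np.asarray(y, dtype=float)
    mu_hat = 1.0 / np.mean(1.0 / y)                    # harmonic-mean estimate of mu
    u = (x - y[None, :]) / h
    kern = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    return mu_hat * (kern / y[None, :]).mean(axis=1) / h

rng = np.random.default_rng(2)
y = rng.gamma(shape=3.0, size=300)                     # stand-in for a length-biased sample
print(jones_kde([1.0, 2.0], y, h=0.4))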
Comparison of the Gamma kernel and the orthogonal series methods of density estimation
The standard kernel density estimator suffers from boundary bias when estimating probability density functions of distributions supported on the positive real line. The Gamma kernel estimators and orthogonal series estimators are two alternatives that are free of boundary bias. In this paper, a simulation study is conducted to compare the small-sample performance of the Gamma kernel estimators and the orthog...
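Gamma kernel estimators place, at each evaluation point x ≥ 0, a Gamma density whose support is the positive half-line, so no kernel mass spills across the boundary at zero. The sketch below uses the simple Chen-type choice Gamma(x/b + 1, scale b); the abstract does not state which variant the paper uses, so the exact parametrization here is an assumption.

import numpy as np
from scipy.stats import gamma

def gamma_kde(x, sample, b):
    """Gamma kernel density estimate on [0, inf): the estimate at x is the
    average of the Gamma(x/b + 1, scale=b) density evaluated at the sample,
    which keeps all kernel mass on the positive half-line (no boundary bias).
    Simple Chen-type variant; the paper's exact form may differ."""
    x = np.atleast_1d(x)
    sample = np.asarray(sample, dtype=float)
    return np.array([gamma.pdf(sample, a=xi / b + 1.0, scale=b).mean() for xi in x])

rng = np.random.default_rng(3)
sample = rng.exponential(size=500)
print(gamma_kde([0.0, 0.5, 2.0], sample, b=0.2))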
A Berry-Esseen Type Bound for the Kernel Density Estimator of Length-Biased Data
Length-biased data are widely seen in applications; they arise mostly in epidemiological studies and in survival analysis in medical research. Here we aim to propose a Berry-Esseen type bound for the kernel density estimator of this kind of data. The rate of normal convergence in the proposed Berry-Esseen type theorem is shown to be O(n^(-1/6)) modulo a logarithmic term as n tends to infin...
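Schematically, a Berry-Esseen type bound of this kind controls the distance to normality uniformly in the argument of the distribution function: for the suitably standardized estimator it asserts sup_t | P( (f_n(x) − E f_n(x)) / sd(f_n(x)) ≤ t ) − Φ(t) | = O(n^(-1/6) · log n), where Φ is the standard normal distribution function. The exact standardization, the power of the logarithmic factor and the regularity conditions are those given in the paper and are not reproduced here.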
The Relative Improvement of Bias Reduction in Density Estimator Using Geometric Extrapolated Kernel
One of the nonparametric procedures used to estimate densities is the kernel method. In this paper, in order to reduce the bias of kernel density estimation, methods such as the usual kernel (UK), the geometric extrapolation usual kernel (GEUK), a bias reduction kernel (BRK) and a geometric extrapolation bias reduction kernel (GEBRK) are introduced. Theoretical properties, including the selection of smoothness para...
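One standard way to build a geometric extrapolated kernel estimator is to combine two bandwidths h and 2h multiplicatively: the product f_h(x)^(4/3) · f_2h(x)^(-1/3) cancels the leading O(h^2) bias term while remaining nonnegative. The sketch below implements this Terrell-Scott style construction as an illustration only; the GEUK and GEBRK variants compared in the paper may differ in detail.

import numpy as np

def kde(x, sample, h):
    """Plain Gaussian-kernel estimate with fixed bandwidth h (illustrative)."""
    u = (np.atleast_1d(x)[:, None] - np.asarray(sample)[None, :]) / h
    return (np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)).mean(axis=1) / h

def geometric_extrapolated_kde(x, sample, h):
    """Geometric extrapolation over bandwidths h and 2h: the combination
    f_h^(4/3) * f_2h^(-1/3) removes the leading O(h^2) bias term while
    staying nonnegative. Terrell-Scott style sketch; not necessarily the
    exact GEUK/GEBRK variants of the paper."""
    f_h = kde(x, sample, h)
    f_2h = kde(x, sample, 2.0 * h)
    return f_h ** (4.0 / 3.0) * f_2h ** (-1.0 / 3.0)

rng = np.random.default_rng(4)
sample = rng.normal(size=600)
print(geometric_extrapolated_kde([0.0, 1.0], sample, h=0.35))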
A Berry-Esseen Type Bound for a Smoothed Version of Grenander Estimator
In various statistical models, such as density estimation and the estimation of regression curves or hazard rates, monotonicity constraints can arise naturally. A frequently encountered problem in nonparametric statistics is to estimate a monotone density function f on a compact interval. A well-known estimator of a density f under the restriction that f is decreasing is the Grenander estimator, ...
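For reference, the (unsmoothed) Grenander estimator is the left-hand slope of the least concave majorant of the empirical distribution function, which yields a decreasing step-function density. A minimal sketch of this classical construction is given below; the smoothed version studied in the paper adds a kernel-smoothing step that is not reproduced here, and the sketch assumes a sample without ties.

import numpy as np

def grenander(sample):
    """Grenander estimator of a decreasing density: slopes of the least
    concave majorant (LCM) of the empirical distribution function.
    Returns the LCM knots and the constant density value on each segment."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    pts = np.column_stack((np.concatenate(([0.0], x)),
                           np.arange(n + 1) / n))       # ECDF knots (0,0), (X_(i), i/n)
    hull = []                                           # upper convex hull = LCM knots
    for p in pts:
        # drop the last knot while it lies on or below the chord to p (keeps concavity)
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            if (y2 - y1) * (p[0] - x2) <= (p[1] - y2) * (x2 - x1):
                hull.pop()
            else:
                break
        hull.append((p[0], p[1]))
    hull = np.array(hull)
    slopes = np.diff(hull[:, 1]) / np.diff(hull[:, 0])  # density value on each segment
    return hull[:, 0], slopes

rng = np.random.default_rng(5)
breaks, density = grenander(rng.exponential(size=200))
print(breaks[:4], density[:3])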